Algorithmic Stability and Hypothesis Complexity

Authors

  • Tongliang Liu
  • Gábor Lugosi
  • Gergely Neu
  • Dacheng Tao
Abstract

We introduce a notion of algorithmic stability of learning algorithms, which we term hypothesis stability, that captures the stability of the hypothesis output by the learning algorithm in the normed space of functions from which hypotheses are selected. The main result of the paper bounds the generalization error of any learning algorithm in terms of its hypothesis stability. The bounds are based on martingale inequalities in the Banach space to which the hypotheses belong. We apply the general bounds to bound the performance of some learning algorithms based on empirical risk minimization and stochastic gradient descent.

Parts of the work were done when Tongliang Liu was a visiting PhD student at Pompeu Fabra University.

Affiliations: School of Information Technologies, Faculty of Engineering and Information Technologies, University of Sydney, Sydney, Australia ([email protected], [email protected]); Department of Economics and Business, Pompeu Fabra University, Barcelona, Spain ([email protected]); ICREA, Pg. Lluís Companys 23, 08010 Barcelona, Spain; Barcelona Graduate School of Economics; AI group, DTIC, Universitat Pompeu Fabra, Barcelona, Spain ([email protected])
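As a rough illustration of the central definition (a hedged sketch in my own notation; the exact norm, constants, and statement are assumptions, not quotes from the paper): an algorithm A mapping a sample S = (z_1, ..., z_n) to a hypothesis A(S) in a normed function space could be called beta_n-hypothesis-stable if replacing any single training example moves the output by at most beta_n in that norm,

    % assumed notation: S^{(i)} is S with z_i replaced by z'
    \[ \sup_{i,\, z'} \; \bigl\| A(S) - A(S^{(i)}) \bigr\|_{\mathcal{F}} \;\le\; \beta_n . \]

A bound of the advertised flavor would then control the expected generalization gap \( \mathbb{E}\bigl[ R(A(S)) - \hat{R}_n(A(S)) \bigr] \) by a term of order beta_n, up to the Lipschitz constant of the loss, plus a fluctuation term that vanishes as n grows.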


Similar Articles

Algorithmic Stability and Uniform Generalization

One of the central questions in statistical learning theory is to determine the conditions under which agents can learn from experience. This includes the necessary and sufficient conditions for generalization from a given finite training set to new observations. In this paper, we prove that algorithmic stability in the inference process is equivalent to uniform generalization across all parame...
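In rough symbols (my own paraphrase under assumptions, not the paper's statement), uniform generalization asks that the expected gap between the empirical and true risk of the learned hypothesis vanish uniformly, e.g. over all data distributions D:

    \[ \lim_{n \to \infty} \; \sup_{\mathcal{D}} \; \Bigl| \, \mathbb{E}_{S \sim \mathcal{D}^n} \bigl[ R(A(S)) - \hat{R}_n(A(S)) \bigr] \Bigr| = 0 . \]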


Generalization Bounds of Regularization Algorithms Derived Simultaneously through Hypothesis Space Complexity, Algorithmic Stability and Data Quality

A main issue in machine learning research is to analyze the generalization performance of a learning machine. Most classical results on the generalization performance of regularization algorithms are derived merely from the complexity of the hypothesis space or the stability property of a learning algorithm. However, in practical applications, the performance of a learning algorithm is not actually...


Almost-everywhere Algorithmic Stability and Generalization Error

We introduce a new notion of algorithmic stability, which we call training stability. We show that training stability is sufficient for good bounds on generalization error. These bounds hold even when the learner has infinite VC dimension. In the PAC setting, training stability gives necessary and sufficient conditions for exponential convergence, and thus serves as a distribution-dependent ana...


The effect of increase in task cognitive complexity on Iranian EFL learners’ accuracy and linguistic complexity: A test of Robinson’s Cognition Hypothesis

Designing a task with a reasonable level of cognitive complexity has always been important for syllabus designers, teachers, as well as researchers. This is because task manipulation may lead to different results in oral production. The present study was an attempt to explore the effect of this manipulation - based on Robinson's resource-directing model (reasoning demands, number of el...


Algorithmic Stability and Generalization Performance

We present a novel way of obtaining PAC-style bounds on the generalization error of learning algorithms, explicitly using their stability properties. A stable learner is one for which the learned solution does not change much with small changes in the training set. The bounds we obtain do not depend on any measure of the complexity of the hypothesis space (e.g. VC dimension) but rather depend o...
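The "does not change much" criterion is easy to probe numerically. Below is a minimal sketch (my own illustration, assuming NumPy and scikit-learn; none of it comes from the paper): train a ridge regressor on a sample S and on a neighboring sample S' differing in one point, then measure how far apart the two learned predictors are on fresh inputs.

    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(0)
    n, d = 200, 5
    X = rng.normal(size=(n, d))
    y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

    # Neighboring sample: replace a single training example.
    X_prime, y_prime = X.copy(), y.copy()
    X_prime[0] = rng.normal(size=d)
    y_prime[0] = rng.normal()

    # Ridge regression is a standard example of a stable learner:
    # the regularizer keeps the two solutions close together.
    h = Ridge(alpha=1.0).fit(X, y)
    h_prime = Ridge(alpha=1.0).fit(X_prime, y_prime)

    # Sup-style gap between the two hypotheses on fresh inputs.
    X_test = rng.normal(size=(1000, d))
    gap = np.max(np.abs(h.predict(X_test) - h_prime.predict(X_test)))
    print(f"empirical hypothesis gap: {gap:.4f}")

Rerunning with a smaller alpha (weaker regularization) typically widens the gap, matching the intuition that stability degrades as regularization vanishes.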



Journal title:

Volume   Issue

Pages  -

Publication date: 2017